PyDigger - unearthing stuff about Python


Name                    | Version      | Summary                                                    | Date
flash-attn              | 2.5.9.post1  | Flash Attention: Fast and Memory-Efficient Exact Attention | 2024-05-27 05:19:16
causal-conv1d           | 1.2.2.post1  | Causal depthwise conv1d in CUDA, with a PyTorch interface  | 2024-05-26 19:53:08
quant-matmul            | 1.2.0        | Quantized MatMul in CUDA with a PyTorch interface          | 2024-03-20 03:44:36
fast-hadamard-transform | 1.0.4.post1  | Fast Hadamard Transform in CUDA, with a PyTorch interface  | 2024-02-13 05:49:17
flash-attn-wheels-test  | 2.0.8.post17 | Flash Attention: Fast and Memory-Efficient Exact Attention | 2023-08-13 21:27:09
flash-attn-xwyzsn       | 1.0.7        | Flash Attention: Fast and Memory-Efficient Exact Attention | 2023-06-01 03:53:40
Author: Tri Dao
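flash-attn provides a fused CUDA kernel for exact (not approximate) attention. As a point of reference for what that kernel computes, here is a minimal NumPy sketch of unfused scaled dot-product attention, simplified to a single head with no batching; the function name and shapes are illustrative, not the package's API.

```python
import numpy as np

def exact_attention(q, k, v):
    """Reference (unfused) scaled dot-product attention.

    flash-attn computes the same result, but in a fused CUDA kernel
    that avoids materializing the full (seq, seq) score matrix.
    q, k, v: arrays of shape (seq_len, head_dim).
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # (seq, seq) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for a stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to 1
    return weights @ v                             # (seq, head_dim) weighted values
```

With an all-zero query, the softmax weights are uniform, so every output row is the mean of the value rows; that makes a handy sanity check.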
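fast-hadamard-transform accelerates the Walsh-Hadamard transform with a CUDA kernel. For illustration only, a pure-Python sketch of the unnormalized O(n log n) butterfly algorithm the package speeds up (the function name is hypothetical, not the package's API):

```python
def hadamard_transform(x):
    """Unnormalized fast Walsh-Hadamard transform in O(n log n).

    A plain-Python reference for the computation that the
    fast-hadamard-transform CUDA package accelerates.
    The input length must be a power of two.
    """
    x = [float(v) for v in x]
    n = len(x)
    assert n and n & (n - 1) == 0, "length must be a power of two"
    h = 1
    while h < n:
        # Butterfly stage: combine elements h apart within blocks of size 2h.
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x
```

Because the Hadamard matrix satisfies H @ H = n * I, applying the unnormalized transform twice returns the input scaled by its length.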